Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding
Authors
Abstract
We provide a new derivation of the log likelihood spectral distance measure for signal processing using the logarithm of the ratio of entropy rate powers. Using this interpretation, we show that the log likelihood ratio is equivalent to the difference of two differential entropies and, further, that it can be written as the difference of two mutual informations. These latter two expressions extend the analysis of signals via the log likelihood ratio beyond spectral matching to the study of their statistical quantities of differential entropy and mutual information, and thereby make the log likelihood ratio of interest in applications beyond spectral matching for speech. Examples from speech coding are presented to illustrate the utility of these new results.
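To make the abstract's central identity concrete, the following sketch uses only standard definitions and is not drawn from the paper itself. Shannon's entropy rate power of a stationary process X with differential entropy rate h(X) is Q_X = (1/(2πe)) e^{2h(X)}, so for two processes X and Y

    log(Q_X / Q_Y) = 2 [h(X) - h(Y)],

i.e., the logarithm of a ratio of entropy rate powers is, up to a constant factor, a difference of differential entropy rates, which is the form of equivalence the abstract describes.

The log likelihood (Itakura) spectral distance itself is a standard linear-prediction distortion measure. Below is a minimal, self-contained Python sketch of one common form of it, comparing the LPC models of a reference and a test frame; the function names, the default model order of 10, and the biased autocorrelation estimate are illustrative assumptions, not details taken from the paper.

    import numpy as np
    from scipy.linalg import toeplitz

    def autocorr(x, order):
        # Biased autocorrelation estimates r[0], ..., r[order] of frame x.
        x = np.asarray(x, dtype=float)
        n = len(x)
        return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])

    def levinson(r, order):
        # Levinson-Durbin recursion: returns the LPC coefficient vector
        # a = [1, a_1, ..., a_p] and the final prediction error power.
        a = np.zeros(order + 1)
        a[0] = 1.0
        err = r[0]
        for i in range(1, order + 1):
            k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
            a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1]
            err *= 1.0 - k * k
        return a, err

    def log_likelihood_ratio(x_ref, x_test, order=10):
        # Itakura log likelihood ratio distance between the LPC models of a
        # test frame and a reference frame, both evaluated on the reference
        # frame's autocorrelation matrix R. It is zero for identical models
        # and positive otherwise, since a_ref minimizes a @ R @ a.
        r = autocorr(x_ref, order)
        R = toeplitz(r)  # (order+1) x (order+1) autocorrelation matrix
        a_ref, _ = levinson(r, order)
        a_test, _ = levinson(autocorr(x_test, order), order)
        return np.log((a_test @ R @ a_test) / (a_ref @ R @ a_ref))

Under these assumptions the distance is zero when the two frames yield the same LPC model and grows as the test frame's short-term spectral envelope departs from the reference's.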
Similar references
A generalization of the entropy power inequality with applications
We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = (x_1, ..., x_n) is a random vector with independent components, x̃ = (x̃_1, ..., x̃_n) is a Gaussian vector with independent components such that h(x̃_i) = h(x_i), i = 1, ..., n, and A is any matrix. This generalization of the entropy power inequality is appli...
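(A quick check on the statement above, not part of the quoted abstract: choosing A to be the single row (1, 1, ..., 1) gives h(x_1 + ... + x_n) ≥ h(x̃_1 + ... + x̃_n); the x̃_i are independent Gaussians whose variances equal the entropy powers e^{2h(x_i)}/(2πe), so exponentiating both sides recovers the classical entropy power inequality e^{2h(x_1 + ... + x_n)} ≥ Σ_i e^{2h(x_i)}.)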
Discriminative maximum entropy language model for speech recognition
This paper presents a new discriminative language model based on the whole-sentence maximum entropy (ME) framework. In the proposed discriminative ME (DME) model, we exploit an integrated linguistic and acoustic model, which properly incorporates features from an n-gram model and the acoustic log likelihoods of target and competing models. Through the constrained optimization of the integrated model, ...
Speech separation based on the GMM PDF estimation
In this paper, the speech separation task is regarded as a convolutive-mixture Blind Source Separation (BSS) problem. The Maximum Entropy (ME) algorithm, the Minimum Mutual Information (MMI) algorithm, and the Maximum Likelihood (ML) algorithm are the main approaches to solving the BSS problem. The relationship among these three algorithms is analyzed in this paper. Based on t...
Generalization of Information Measures
General formulas for entropy, mutual information, and divergence are established. It is revealed that these quantities are actually determined by three decisive sequences of random variables, which are, respectively, the normalized source information density, the normalized channel information density, and the normalized log-likelihood ratio. In terms of the ultimate cumulative distribution f...
Ultra low bit-rate speech coding based on unit-selection with joint spectral-residual quantization: no transmission of any residual information
A recent trend in ultra low bit-rate speech coding is based on segment quantization by the unit-selection principle, using large continuous codebooks as a unit database. We show that the use of such large unit databases allows speech to be reconstructed at the decoder by using the best unit's residual itself (in the unit database), thereby obviating the need to transmit any side information about the re...
Journal: Entropy
Volume: 19, Issue: -
Pages: -
Publication date: 2017